A Martingale Which Moves Along a Deterministic Path

Figure 1: Sample paths

In this post I will construct a continuous and non-constant martingale M which only varies on the path of a deterministic function {f\colon{\mathbb R}_+\rightarrow{\mathbb R}}. That is, {M_t=f(t)} at all times outside of the set of nontrivial intervals on which M is constant. Expressed in terms of the stochastic integral, {dM_t=0} on the set {\{t\colon M_t\not=f(t)\}} and,

\displaystyle  M_t = \int_0^t 1_{\{M_s=f(s)\}}\,dM_s. (1)

In the example given here, f will be right-continuous. Examples with continuous f do exist, although the constructions I know of are considerably more complicated. At first sight, these properties appear to contradict what we know about continuous martingales. They vary unpredictably, behaving completely unlike any deterministic function. It is certainly the case that we cannot have {M_t=f(t)} across any interval on which M is not constant.

By a stochastic time-change, any Brownian motion B can be transformed to have the same distribution as M. This means that there exists an increasing and right-continuous process A adapted to the same filtration as B and such that {B_t=M_{A_t}} where M is a martingale as above. From this, we can infer that

\displaystyle  B_t=f(A_t),

expressing Brownian motion as a function of an increasing process.

Using standard properties of quadratic variations, equation (1) can alternatively be expressed in terms of the quadratic variation,

\displaystyle  [M]_t=\int_0^t1_{\{M_s=f(s)\}}\,d[M]_s. (2)
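
To see why (1) and (2) express the same property, one can argue as follows (a brief aside; the notation N is introduced only for this remark). Write the stochastic integral over the complementary set as

\displaystyle  N_t=\int_0^t1_{\{M_s\not=f(s)\}}\,dM_s,\qquad [N]_t=\int_0^t1_{\{M_s\not=f(s)\}}\,d[M]_s.

Since {M_t=M_0+\int_0^t1_{\{M_s=f(s)\}}\,dM_s+N_t}, equation (1) states that {M_0=0} and that the continuous local martingale N vanishes identically. As a continuous local martingale started at zero vanishes identically exactly when its quadratic variation does, this is equivalent (given {M_0=0}) to (2).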

The motivation for the example given here is that it defies all of the methods which I have tried in order to prove the hypotheses of the previous post. That is, if {g\colon{\mathbb R}_+\times{\mathbb R}\rightarrow{\mathbb R}} is such that {g(t,x)} is convex in x and decreasing in t, then is {g(t,M_t)} a semimartingale? In the case where g is twice continuously differentiable, Ito’s Lemma can be used to write

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle g(t,M_t) &\displaystyle= g(0,M_0)+\int_0^t g_x(s,M_s)\,dM_s+V^+_t+V^-_t,\smallskip\\ \displaystyle V^+_t&\displaystyle = \frac12\int_0^t g_{xx}(s,M_s)\,d[M]_s,\smallskip\\ \displaystyle V^-_t&\displaystyle = \int_0^tg_t(s,M_s)\,ds. \end{array}

This expresses {g(t,M_t)} as the sum of a local martingale and the processes {V^+} and {V^-}, which are respectively increasing and decreasing. One approach for non-differentiable functions is to approximate g by smooth functions and try to show that the terms {V^+} and {V^-} remain bounded in the limit. Hence, {V=V^++V^-} would converge to an FV process, and {g(t,M_t)} would be a semimartingale. We can compute the expectation {{\mathbb E}[V^+]} in terms of the function {C\colon{\mathbb R}_+\times{\mathbb R}\rightarrow{\mathbb R}}

\displaystyle  C(t,x) = {\mathbb E}\left[(M_t-x)_+\right],

which encodes the one-dimensional marginals of M. Then,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle {\mathbb E}[V^+_T]&\displaystyle= \frac12{\mathbb E}\left[\int_0^T g_{xx}(t,M_t)\,d[M]_t\right]\smallskip\\ &\displaystyle=\int\!\!\!\int_0^T g_{xx}(t,x)C_t(t,x)\,dt\,dx \end{array} (3)
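
As a rough indication of why the second equality in (3) holds, here is a heuristic sketch assuming enough regularity, writing {L^x} for the local time of M at level x,

\displaystyle  {\mathbb E}\left[\int_0^T g_{xx}(t,M_t)\,d[M]_t\right] = \int\!\!\!\int_0^T g_{xx}(t,x)\,{\mathbb E}\left[dL^x_t\right]dx = 2\int\!\!\!\int_0^T g_{xx}(t,x)C_t(t,x)\,dt\,dx.

The first equality is the occupation time formula for local times and, for the second, taking expectations in the Tanaka formula {(M_t-x)_+=(M_0-x)_++\int_0^t1_{\{M_s>x\}}\,dM_s+\frac12L^x_t} gives {C(t,x)=C(0,x)+\frac12{\mathbb E}[L^x_t]}, so that {{\mathbb E}[dL^x_t]=2C_t(t,x)\,dt}. Dividing by two then gives (3). A closely related argument is given in the comments below.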

Although there is no guarantee that the time derivative {C_t} exists in a pointwise sense, it can be understood in the sense of distributions. Then, as C is increasing in time, the theory of Lebesgue-Stieltjes integration implies that the integral {\int_0^T\cdot\; C_t\,dt} is well-defined as a finite measure, so the right-hand side of (3) is a well-defined integral if g is twice differentiable, and is finite if {g_{xx}} has compact support. Similarly, by convexity in x, the derivative {g_x} is increasing and the integral {\int\cdot\; g_{xx}\,dx} is a well-defined measure. So, changing the order of integration, if C is differentiable then the right-hand side of (3) can be written as {\iint C_t g_{xx}\,dx\,dt}, which is well-defined. In general, however, (3) is not well-defined. It may be hoped that {\iint\cdot\;C_t\,dt\,dx} is absolutely continuous with respect to x, so that the integral of {g_{xx}} can be defined. However, as the example constructed in this post shows, for each t the measure is singular with respect to x, with a single atom at {x=f(t)}. From (2),

\displaystyle  \iint g_{xx}C_t\,dt\,dx = \iint1_{\{x=f(t)\}}g_{xx}C_t\,dt\,dx.

This rules out any interpretation of (3), as far as I know, for non-differentiable g.


Construction of the Example

We now describe the construction of the martingale. I will just consider a martingale {\{M_t\}_{t\in[0,1]}} with time index t in the interval {[0,1]}, which can be extended to a martingale with time index set {{\mathbb R}_+} by a time change, if desired. Also, the example constructed will have {\lvert M\rvert\le1}. That is, the space-time process {(t,M_t)} will lie in the rectangular region

\displaystyle  R_0=[0,1]\times[-1,1].

We also choose M with {M_0=0} and with {M_1=\pm1} so that {{\mathbb P}(M_1=1)} and {{\mathbb P}(M_1=-1)} are both {1/2}. The idea of the construction is to use self-similarity. In each of the following rectangular subregions of {R_0}, {M_t} will, suitably scaled, have the same distribution as the full process.

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle R_1 &\displaystyle=[0,1/4]\times[-1/2,1/2],\smallskip\\ \displaystyle R_2 &\displaystyle=[1/4,1/2]\times[0,1],\smallskip\\ \displaystyle R_3 &\displaystyle=[1/2,3/4]\times[-1,0],\smallskip\\ \displaystyle R_4 &\displaystyle=[3/4,1]\times[-1,1]. \end{array}

Outside of these regions, M will be constant. This is as in Figure 2.

Figure 2: Self-similar subregions

For {i=1,2,3,4} we define times {t_i=(i-1)/4} and bijective maps {\phi_i\colon R_0\rightarrow R_i} by,

\displaystyle  \setlength\arraycolsep{2pt} \begin{array}{rl} \displaystyle\phi_i(t,x)&\displaystyle=\left(t_i+t/4,u_i(x)\right),\smallskip\\ \displaystyle u_i(x)&\displaystyle=\begin{cases} x/2,&i=1,\\ (x+1)/2,&i=2,\\ (x-1)/2,&i=3,\\ x,&i=4. \end{cases} \end{array}

The distribution of M is defined on the range {t_i\le t\le i/4} as follows: conditional on {M_{t_i}=u_i(0)}, the process {s\mapsto M_{t_i+s/4}} has the same distribution as {u_i(M_s)}; otherwise, {M_{t_i+s/4}=M_{t_i}} is constant.

The martingale can be constructed iteratively, starting with a process {\{M^0_t\}_{t\in[0,1]}} satisfying {M^0_0=0} and {{\mathbb P}(M^0_1=1)={\mathbb P}(M^0_1=-1)=1/2}. For each {n\ge0}, we do the following (a short simulation sketch is given below the list):

  • Construct three independent processes {M^{n,1},M^{n,2},M^{n,3}}, each with the same distribution as {M^n}, on some probability space.
  • For {0\le t\le1/4}, set {M^{n+1}_t=M^{n,1}_{4t}/2}.
  • For {1/4 < t\le1/2}, if {M^{n,1}_1=1} set {M^{n+1}_t=(M^{n,2}_{4t-1}+1)/2}; otherwise, {M^{n+1}_t=-1/2}.
  • For {1/2 < t\le3/4}, if {M^{n,1}_1=-1} set {M^{n+1}_t=(M^{n,2}_{4t-2}-1)/2}; otherwise, {M^{n+1}_t=M^{n+1}_{1/2}}.
  • For {3/4 < t\le 1}, if {M^{n,1}_1=-M^{n,2}_1} set {M^{n+1}_t=M^{n,3}_{4t-3}}; otherwise, {M^{n+1}_t=M^{n,1}_1}.

The distribution of {M_t} for t in the set {\{4^{-n}k\colon 0\le k\le4^n\}} is given by that of {M^m_t}, which does not depend on m for {m\ge n}. Several paths computed with this procedure are shown in Figure 1. This uniquely defines the martingale over the index set of dyadic rationals {\mathbb{D}=\{2^{-n}k\colon n\ge0,\,0\le k\le2^n\}}. By standard martingale convergence results, it is possible to choose a version of M which has left and right limits everywhere over {0\le t\le 1}.
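
The following Python sketch is my own illustration of this iterative procedure, not code from the post; the function name sample_grid and the fixed seed are arbitrary. It samples the values of {M^n} at the grid times {4^{-n}k} and can be used to generate paths like those shown in Figure 1.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_grid(n):
    """Sample the values of M^n at the 4^n + 1 grid times k / 4^n.

    M^0 is the two-point martingale M^0_0 = 0, M^0_1 = +/-1 with
    probability 1/2 each; only its endpoint values are ever used.
    """
    if n == 0:
        return np.array([0.0, rng.choice([-1.0, 1.0])])
    # three independent copies of M^{n-1} on the coarser grid
    m1, m2, m3 = (sample_grid(n - 1) for _ in range(3))
    q = len(m1) - 1                     # 4^{n-1} grid steps per quarter
    out = np.empty(4 * q + 1)
    # t in [0, 1/4]: scaled copy u_1(first copy), region R_1
    out[:q + 1] = m1 / 2
    # t in (1/4, 1/2]: u_2(second copy) if the first copy ends at +1, else constant -1/2
    out[q + 1:2 * q + 1] = (m2[1:] + 1) / 2 if m1[-1] == 1 else -0.5
    # t in (1/2, 3/4]: u_3(second copy) if the first copy ends at -1, else constant
    out[2 * q + 1:3 * q + 1] = (m2[1:] - 1) / 2 if m1[-1] == -1 else out[2 * q]
    # t in (3/4, 1]: unscaled third copy if the path currently sits at 0, else constant
    out[3 * q + 1:] = m3[1:] if m1[-1] == -m2[-1] else m1[-1]
    return out

# Example: one sample path on the level-6 grid.
path = sample_grid(6)
times = np.linspace(0.0, 1.0, len(path))   # time grid, e.g. for plotting
print(path[0], path[-1])                   # always 0.0 and +/-1.0
```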

Continuity of the sample paths remains to be shown. To do this, fix a positive real {\alpha} and compute the {\alpha}-variation as

\displaystyle  V_n=\sum_{k=1}^{4^n}\left\lvert M_{4^{-n}k}-M_{4^{-n}(k-1)}\right\rvert^\alpha.

Looking at the limit for any sample path as n goes to infinity,

\displaystyle  \liminf_{n\rightarrow\infty}V_n\ge\sum_{t\in[0,1]}\left\lvert\Delta M_t\right\rvert^\alpha.

If it can be shown that {V_n} tends to zero in probability then continuity of M will follow. From the definition, {V_0=1}. Using the self-similarity, the quarters corresponding to {R_1} and to one of {R_2}, {R_3} each contain a copy of M scaled by {1/2} in space, contributing terms distributed as {2^{-\alpha}V_n}, while the quarter corresponding to {R_4} contains an unscaled copy with probability {1/2} and is otherwise constant. Hence,

\displaystyle  {\mathbb E}[V_{n+1}]=2{\mathbb E}[2^{-\alpha}V_n]+\frac12{\mathbb E}[V_n].

This is solved by

\displaystyle  {\mathbb E}[V_n]=(2^{1-\alpha}+1/2)^n,

which tends to zero for any {\alpha > 2}, since then {2^{1-\alpha}+1/2 < 1}. Hence, M is a continuous martingale.
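
As a quick numerical sanity check, again my own illustration reusing the sample_grid sketch above, the sample mean of {V_n} can be compared with {(2^{1-\alpha}+1/2)^n}; for {\alpha=3} this is {(3/4)^n}.

```python
import numpy as np

# Monte Carlo check of E[V_n] = (2^{1-alpha} + 1/2)^n, assuming the
# sample_grid function from the earlier sketch is in scope.
alpha = 3.0
for n in range(1, 6):
    v = [np.sum(np.abs(np.diff(sample_grid(n))) ** alpha) for _ in range(500)]
    print(n, np.mean(v), (2 ** (1 - alpha) + 0.5) ** n)
```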

It only remains to construct a function {f\colon[0,1)\rightarrow{\mathbb R}} such that {M_t=f(t)} except on a countable set of intervals where M is constant. To do this, start by defining the set {S_0=[0,1)\times[-1,1]} and iteratively define

\displaystyle  S_{n+1}=\phi_1(S_n)\cup\phi_2(S_n)\cup\phi_3(S_n)\cup\phi_4(S_n).

The set {S_1} consists of the union of the four regions in Figure 2. By construction, on each of the intervals {[t_i,i/4)}, M is either constant or has the same distribution as M on the interval {[0,1)} scaled by the map {\phi_i}. By induction then, for each n, there is a finite union of intervals on which M is constant and, elsewhere, {(t,M_t)} lies inside {S_n}. This is shown in Figure 1, where the shaded region represents {S_6}. Setting

\displaystyle  S_\infty=\bigcap_{n=1}^\infty S_n,

there is a countable collection of intervals on which M is constant and {(t,M_t)} lies inside {S_\infty} elsewhere. So, we only need to show that {S_\infty} is the graph of a right-continuous function.
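
As an illustration, the rectangles making up {S_n} can be generated by repeatedly applying the maps {\phi_i}. The Python sketch below is my own (the helper names u and level_rects are not from the post); plotting its output for {n=6} would give a shaded region like the one in Figure 1.

```python
def u(i, x):
    """Spatial part u_i of phi_i, for i = 1,...,4, as defined above."""
    return (x / 2, (x + 1) / 2, (x - 1) / 2, x)[i - 1]

def level_rects(n):
    """Return S_n as a list of rectangles ((t0, t1), (x0, x1)).

    Time intervals are to be read as half-open, [t0, t1), matching
    S_0 = [0, 1) x [-1, 1].
    """
    rects = [((0.0, 1.0), (-1.0, 1.0))]  # S_0
    for _ in range(n):
        rects = [
            (((i - 1) / 4 + t0 / 4, (i - 1) / 4 + t1 / 4),  # time part of phi_i
             (u(i, x0), u(i, x1)))                           # spatial part u_i
            for (t0, t1), (x0, x1) in rects
            for i in (1, 2, 3, 4)
        ]
    return rects

# level_rects(1) recovers the four rectangles R_1,...,R_4 of Figure 2;
# level_rects(6) gives the 4^6 rectangles shading S_6 as in Figure 1.
```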

For any {t\in[0,1)}, it needs to be shown that there is a unique x with {(t,x)\in S_\infty}. Then, we can take {f(t)=x}. To prove this, write out the base 4 representation of t,

\displaystyle  t=0.a_1a_2a_3 \ldots=\sum_{k=1}^\infty4^{-k}a_k.

The digits {a_k} are integers from 0 to 3, and are uniquely defined so long as we restrict to expansions which do not end with a trailing infinite sequence of 3’s. The digit {a_k} selects the map {u_{a_k+1}} and, from the definition of {S_n}, it can be seen that the set of x with {(t,x)\in S_n} is

\displaystyle  u_{a_1+1}\circ u_{a_2+1}\circ\cdots\circ u_{a_n+1}([-1,1]).

This is a closed interval of length {2^{1-r_n}}, where {r_n} is the number of {k\le n} with {a_k\le2}. As the base 4 representation does not end with a trailing sequence of 3’s, {r_n} tends to infinity. So, taking the limit as n goes to infinity, the set of x with {(t,x)\in S_\infty} is a closed interval of zero length and hence is a single point, as required.
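
To make the argument concrete, here is a short Python sketch, again my own (f_approx is not a name from the post, and it reuses the u helper from the previous sketch), which approximates f(t) by composing the maps along the base-4 digits of t.

```python
def f_approx(t, ndigits=50):
    """Approximate f(t) for t in [0, 1) from the first base-4 digits of t.

    Digit a_k selects the map u_{a_k + 1} (u as in the previous sketch);
    composing the maps shrinks [-1, 1] to an interval of length 2^(1 - r_n),
    whose midpoint approximates f(t).
    """
    digits, frac = [], t
    for _ in range(ndigits):
        frac *= 4
        a = int(frac)        # next base-4 digit of t, in {0, 1, 2, 3}
        frac -= a
        digits.append(a)
    lo, hi = -1.0, 1.0       # apply u_{a_n+1} first, ..., u_{a_1+1} last
    for a in reversed(digits):
        lo, hi = u(a + 1, lo), u(a + 1, hi)
    return (lo + hi) / 2

# e.g. f_approx(0.25) returns 0.5, which is u_2(0)
```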

Finally, the right-continuity of f follows from the fact that the {S_n} are left-closed sets in the sense that, for any convergent sequence {(t_k,x_k)\in S_n} with {t_k} decreasing, the limit lies in {S_n}.

3 thoughts on “A Martingale Which Moves Along a Deterministic Path”

  1. You write:
    > “such that {g(t,x)} is convex in x and increasing in t, then is”

    I think you meant “decreasing”… [GL: Fixed, thanks!]

  2. Sorry, could you be a bit more clear about how {d[M]_t} inside the expectation converts to {C_t(t,M_t)L_t} inside (3)? (I’m guessing here, with {L_t} the local time, or maybe its inverse…) It is quite confusing for people who do not do such calculations in their sleep (yet?)!

    1. Here’s a rough argument: if {f\colon{\mathbb R}_+\times{\mathbb R}\rightarrow{\mathbb R}} is smooth and goes to zero at {t=0} and for large t, then {df(t,M_t)=f_t(t,M_t)\,dt+\frac12f_{xx}(t,M_t)\,d[M]_t+f_x(t,M_t)\,dM_t}. Integrating, taking expectations and removing the martingale term, which has expectation zero, gives {\int{\mathbb E}[f_t(t,M_t)]\,dt=-\frac12{\mathbb E}\left[\int f_{xx}(t,M_t)\,d[M]_t\right]}. As the density of {M_t} is given by the second derivative of {C(t,x)} with respect to x, this is {\iint f_t(t,x)C_{xx}\,dx\,dt=-\frac12{\mathbb E}\left[\int f_{xx}(t,M_t)\,d[M]_t\right]}. Using integration by parts, {\iint f_{xx}(t,x)C_t\,dx\,dt=\frac12{\mathbb E}\left[\int f_{xx}(t,M_t)\,d[M]_t\right]}. As {f_{xx}} can be an arbitrary smooth function of compact support, this gives the result.
      This is just a general formula which I have in mind and have used frequently, so I put it in this post without explanation. I have a rigorous proof in Lemma 3.3 here.
